Add samples for log prob and search grounding #273
Conversation
samples/log_prob.js
Outdated
 * limitations under the License.
 */

import {
nit: I'd leave these imports on a single line, unless they get longer than 80 characters. That's what I usually see in JS.
samples/log_prob.js
Outdated
  },
},
);
const prompt =
Probably don't need to split this across two lines.
samples/search_grouding.js
Outdated
);

const prompt =
  "What is the Google stock today?";
"What is the Google stock today?" -> "What is the price of Google stock today?"
samples/search_grouding.js
Outdated
"What is the Google stock today?";

const result = await model.generateContent(prompt);
console.log(JSON.stringify(result.response));
You shouldn't need to use JSON.stringify here. Just console.log(result.response);
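To illustrate the suggestion above: in Node, console.log already pretty-prints nested objects, so wrapping the response in JSON.stringify only matters when a serialized string is needed. The response object below is a hypothetical stand-in for the SDK's result.response, not the actual shape returned in this PR's sample.

```javascript
// Hypothetical object standing in for result.response from
// model.generateContent(); the real shape comes from the SDK.
const result = {
  response: {
    candidates: [{ content: { parts: [{ text: "Sample answer" }] } }],
  },
};

// console.log renders nested objects readably on its own in Node.
console.log(result.response);

// JSON.stringify is only needed when a single serialized string is
// required, e.g. for a structured logging pipeline.
console.log(JSON.stringify(result.response));
```
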
Fixed
(1) does this match the samples being done for Python and other languages? When we first added samples to the repo, I was given the impression it was really important that all the samples match across languages as much as possible - especially the includecode region tags. Also the content of the prompts, etc.
(2) Have you run these samples against the live backend, do they give the expected output?
async function enableLogProb() {
  // [START log probability]
  const genAI = new GoogleGenerativeAI(process.env.API_KEY);
For devsite snippets we add a comment like this about what imports the user will need, since the import at the top of the file will not be included in the includecode snippet: https://github.com/google-gemini/generative-ai-js/blob/add-samples/samples/text_generation.js#L43-L44
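The convention described above can be sketched as follows. Devsite extracts only the region between the [START ...] and [END ...] tags, so a file-level import is not shown to readers; the comment restates it inside the region. The region tag name and function name below are illustrative, not taken from this PR's diff.

```javascript
// Only the region between the tags is published as a snippet, so the
// import is repeated as a comment for readers of the rendered docs.
function snippetRegion() {
  // [START log_probability]
  // Make sure to include these imports:
  // import { GoogleGenerativeAI } from "@google/generative-ai";
  return "snippet body";
  // [END log_probability]
}

console.log(snippetRegion());
```
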
I see this is resolved, but I don't see the comment added?
Done
samples/search_grouding.js
Outdated
const genAI = new GoogleGenerativeAI(process.env.API_KEY);
const model = genAI.getGenerativeModel(
  {
    model: "gemini-1.5-pro",
Is there a reason for choosing different models for each sample?
I see the two models are now flash and flash-002, is there a reason they are different?
Because logprobs are only supported for the 002 models. A bit of background: Labs calls Vertex for all 002 models, and logprobs are a feature only supported by Vertex for now.
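As a minimal sketch of what the logprob sample's model setup might look like, assuming the REST API's GenerationConfig field names (responseLogprobs, logprobs) — these names are an assumption here, not quoted from this PR's diff:

```javascript
// Sketch of a generation config requesting log probabilities.
// Field names follow the REST API's GenerationConfig and are
// assumptions, not taken from this PR.
const generationConfig = {
  // Ask the backend to return log probabilities for chosen tokens.
  responseLogprobs: true,
  // How many top alternative tokens to report per position.
  logprobs: 5,
};

// Logprob support currently lives in Vertex, which serves the -002
// models, hence the -002 model name in the sample.
const modelParams = { model: "gemini-1.5-flash-002", generationConfig };
console.log(modelParams.model);
```
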
(1) I don't see that Python has examples of search grounding or log probs.
Add import comment to new samples.
Please help review the new samples.